Although different objects possess distinct class-specific features, they usually also share common patterns. This observation has been partially exploited in a recently proposed dictionary learning framework that separates the particularity and the commonality (COPAR). Inspired by this, we propose a novel method that explicitly and simultaneously learns a set of common patterns as well as class-specific features for classification, under more intuitive constraints. Our dictionary learning framework is therefore characterized by both a shared dictionary and particular (class-specific) dictionaries. For the shared dictionary, we enforce a low-rank constraint, i.e., we require that its spanning subspace have low dimension and that the coefficients corresponding to this dictionary be similar. For the particular dictionaries, we impose the well-known constraints of Fisher discrimination dictionary learning (FDDL). Furthermore, we develop new fast and accurate algorithms to solve the subproblems in the learning step, accelerating its convergence. These algorithms can also be applied to FDDL and its extensions. Their efficiency is verified both theoretically and experimentally by comparing their complexities and running times with those of other well-known dictionary learning methods. Experimental results on widely used image datasets demonstrate the advantages of our method over state-of-the-art dictionary learning methods.
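As an illustrative sketch only (not the paper's exact formulation), the constraints described above can be collected into an objective of the following schematic form. Here Y denotes the training data, D the concatenation of the particular (class-specific) dictionaries, D_0 the shared dictionary, X and X^0 the corresponding coefficient matrices; the Fisher-type term f(X), the mean matrix M^0, and the weights lambda_1, lambda_2, lambda_3, eta are assumed placeholders for illustration:

\begin{aligned}
\min_{D,\,D_0,\,X,\,X^0}\quad
& \|Y - D X - D_0 X^0\|_F^2                  && \text{(reconstruction fidelity)} \\
& + \lambda_1 \big(\|X\|_1 + \|X^0\|_1\big)  && \text{(sparsity of coefficients)} \\
& + \lambda_2\, f(X)                         && \text{(Fisher discrimination on particular coefficients)} \\
& + \lambda_3 \|X^0 - M^0\|_F^2              && \text{(shared coefficients kept similar, i.e. close to their mean)} \\
& + \eta\, \|D_0\|_*                         && \text{(nuclear norm promoting a low-rank shared dictionary)}
\end{aligned}

The nuclear-norm term encodes the low-dimensional-subspace requirement on the shared dictionary, while the remaining terms correspond to the FDDL-style constraints on the particular dictionaries; the exact terms and weights used in the paper may differ.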